Efficient Learning of Linear Perceptrons

Authors

  • Shai Ben-David
  • Hans Ulrich Simon
Abstract

We consider the existence of efficient algorithms for learning the class of half-spaces in ℝ^n in the agnostic learning model (i.e., making no prior assumptions on the example-generating distribution). The resulting combinatorial problem, finding the best-agreement half-space over an input sample, is NP-hard to approximate to within some constant factor. We suggest a way to circumvent this theoretical bound by introducing a new measure of success for such algorithms. An algorithm is μ-margin successful if the agreement ratio of the half-space it outputs is as good as that of any half-space once training points that are inside the μ-margins of its separating hyper-plane are disregarded. We prove crisp computational complexity results with respect to this success measure: on one hand, for every positive μ, there exist efficient (poly-time) μ-margin successful learning algorithms. On the other hand, we prove that unless P = NP, there is no algorithm that runs in time polynomial in the sample size and in 1/μ that is μ-margin successful for all μ > 0.
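To make the success measure concrete, here is a minimal Python sketch (not the authors' algorithm) of the two quantities involved: the ordinary agreement ratio of a half-space on a labelled sample, and its agreement once points inside the μ-margin of its separating hyper-plane are disregarded. The unit-norm assumption on w and all function names are ours.

```python
import numpy as np

def agreement_ratio(w, b, X, y):
    """Fraction of the labelled sample (X, y) on which the half-space
    {x : w.x + b >= 0} agrees with the labels y in {-1, +1}."""
    preds = np.sign(X @ w + b)
    preds[preds == 0] = 1          # break ties toward the positive side
    return float(np.mean(preds == y))

def mu_margin_agreement(w, b, X, y, mu):
    """Agreement ratio of (w, b) after discarding the training points that
    lie inside the mu-margin of its separating hyper-plane.  w is assumed
    to be a unit vector, so X @ w + b is the signed distance to the plane."""
    dist = X @ w + b
    keep = np.abs(dist) > mu       # keep only points outside the mu-margin
    if not np.any(keep):
        return 1.0                 # every point was discarded
    preds = np.sign(dist[keep])
    preds[preds == 0] = 1
    return float(np.mean(preds == y[keep]))
```

Under this reading, an algorithm is μ-margin successful if the agreement ratio of its output half-space is at least the μ-margin agreement achieved by any competing half-space.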

Similar papers

Ensemble learning of linear perceptrons; Online learning theory

Abstract Within the framework of on-line learning, we study the generalization error of an ensemble learning machine learning from a linear teacher perceptron. The generalization error achieved by an ensemble of linear perceptrons having homogeneous or inhomogeneous initial weight vectors is precisely calculated at the thermodynamic limit of a large number of input elements and shows rich behav...

Local linear perceptrons for classification

A structure composed of local linear perceptrons for approximating global class discriminants is investigated. Such local linear models may be combined in a cooperative or competitive way. In the cooperative model, a weighted sum of the outputs of the local perceptrons is computed where the weight is a function of the distance between the input and the position of the local perceptron. In the c...
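As a rough illustration of the cooperative model described above, the sketch below combines several local linear perceptrons, weighting each output by a Gaussian function of the distance between the input and that perceptron's position. The Gaussian choice and all names are assumptions; the abstract only states that the weight is a function of that distance.

```python
import numpy as np

def cooperative_local_perceptrons(x, positions, W, b, width=1.0):
    """Cooperative combination of local linear perceptrons.

    positions : (K, d) anchor position of each local model
    W, b      : (K, d) weight vectors and (K,) biases of the local models
    width     : bandwidth of the (assumed) Gaussian distance weighting
    """
    local_out = W @ x + b                        # one linear output per local model
    d2 = np.sum((positions - x) ** 2, axis=1)    # squared distance of x to each position
    alpha = np.exp(-d2 / (2.0 * width ** 2))     # distance-based weights
    alpha /= alpha.sum()
    return np.sign(alpha @ local_out)            # weighted sum, thresholded to a class label
```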

On Boolean Combinations of Definitive Classifiers

We consider the sample complexity of concept learning when we classify by using a fixed Boolean function of the outputs of a number of different classifiers. Here, we take into account the ‘margins’ of each of the constituent classifiers. A special case is that in which the constituent classifiers are linear threshold functions (or perceptrons) and the fixed Boolean function is the majority fun...
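For the special case singled out in this abstract, the following sketch classifies by a fixed Boolean function, here the majority, of several linear threshold functions. The names and the ±1 label convention are assumptions.

```python
import numpy as np

def majority_of_perceptrons(x, W, b):
    """Majority vote of K linear threshold functions (perceptrons):
    W is (K, d), b is (K,), labels are taken in {-1, +1}."""
    votes = np.sign(W @ x + b)
    votes[votes == 0] = 1          # treat points on a hyper-plane as positive
    return 1 if votes.sum() >= 0 else -1
```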

Nonlinear Classification using Ensemble of Linear Perceptrons

In this study we introduce a neural network ensemble composed of several linear perceptrons, to be used as a classifier that can rapidly be trained and effectively deals with nonlinear problems. Although each member of the ensemble can only deal with linear classification problems, through a competitive training mechanism, the ensemble is able to automatically allocate a part of the learning sp...
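The competitive training mechanism is only named in the truncated abstract; one common realisation is winner-take-all updates, sketched below under that assumption (all parameter names are ours, not the authors' procedure).

```python
import numpy as np

def competitive_train(X, y, K=5, epochs=20, lr=0.1, seed=0):
    """Winner-take-all training of K linear perceptrons: for each example,
    only the member with the largest response magnitude is updated with the
    perceptron rule, so each member specialises on part of the input space."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.01, size=(K, d))
    b = np.zeros(K)
    for _ in range(epochs):
        for i in rng.permutation(n):
            scores = W @ X[i] + b
            k = np.argmax(np.abs(scores))       # competition: pick the winning member
            if np.sign(scores[k]) != y[i]:      # perceptron update on the winner only
                W[k] += lr * y[i] * X[i]
                b[k] += lr * y[i]
    return W, b
```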

Learning in large linear perceptrons and why the thermodynamic limit is relevant to the real world

We present a new method for obtaining the response function 𝒢 and its average G from which most of the properties of learning and generalization in linear perceptrons can be derived. We first rederive the known results for the 'thermodynamic limit' of infinite perceptron size N and show explicitly that 𝒢 is self-averaging in this limit. We then discuss extensions of our method to more general l...

A New Identification Scheme Based on the Perceptrons Problem

Identification is a useful cryptographic tool. Since zero-knowledge theory appeared [3], several interactive identification schemes have been proposed (in particular Fiat-Shamir [2] and its variants [4, 6, 5], Schnorr [9]). These identifications are based on number theoretical problems. More recently, new schemes appeared with the peculiarity that they are more efficient from the computational ...


Publication year: 2000